Sensitivity of feedforward neural networks to weight errors

Authors

  • Maryhelen Stevenson
  • Rodney Winter
  • Bernard Widrow
Abstract

An analysis is made of the sensitivity of feedforward layered networks of Adaline elements (threshold logic units) to weight errors. An approximation is derived which expresses the probability of error for an output neuron of a large network (a network with many neurons per layer) as a function of the percentage change in the weights. As would be expected, the probability of error increases with the number of layers in the network and with the percentage change in the weights. The probability of error is essentially independent of the number of weights per neuron and of the number of neurons per layer, as long as these numbers are large (on the order of 100 or more).
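The perturbation model the abstract describes (every weight changed by a given percentage, then checking whether the thresholded output flips) can be illustrated with a small Monte Carlo sketch. This is not the paper's analytical derivation; the Gaussian weights, bipolar inputs, and the `flip_probability` helper are illustrative assumptions:

```python
import numpy as np

def flip_probability(n_weights, rel_error, n_trials=20000, seed=0):
    """Monte Carlo estimate of the chance that a single Adaline's output
    sign flips when each weight is perturbed by rel_error times its own
    magnitude (random sign), for a random bipolar input."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_trials, n_weights))          # random weight vectors
    x = rng.choice([-1.0, 1.0], (n_trials, n_weights))      # random +/-1 inputs
    dw = rel_error * np.abs(w) * rng.choice([-1.0, 1.0], (n_trials, n_weights))
    s_clean = np.sign(np.sum(w * x, axis=1))                # unperturbed output sign
    s_noisy = np.sign(np.sum((w + dw) * x, axis=1))         # perturbed output sign
    return float(np.mean(s_clean != s_noisy))

for rho in (0.05, 0.1, 0.3):
    print(rho, flip_probability(100, rho))
```

Consistent with the abstract, the estimated flip probability grows with the percentage weight error, and for large numbers of weights per neuron it changes little with `n_weights`.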


Similar Articles

Empirical Study of Least Sensitive FFANN for Weight-Stuck-at Zero Fault

An important consideration for neural hardware is its sensitivity to input and weight errors. In this paper, an empirical study is performed to analyze the sensitivity of feedforward neural networks to Gaussian noise in the inputs and weights. Thirty FFANNs are trained on four different classification tasks. The network least sensitive to input and weight errors is chosen for further study of fault ...


A New Weight Initialization Method Using Cauchy’s Inequality Based on Sensitivity Analysis

In this paper, an efficient weight initialization method is proposed, using Cauchy's inequality based on sensitivity analysis, to improve the convergence speed of single-hidden-layer feedforward neural networks. The proposed method ensures that the outputs of the hidden neurons lie in the active region, which increases the rate of convergence. The weights are then learned by minimizing the sum of squa...


Characterizing Network Complexity and Classification Efficiency by the Ratio of Weight Interdependence to Sensitivity

We extend previous research on digital filter structures and parameter sensitivity to the relationship between the nature of hidden-unit activation function, weight sensitivity and interdependence, and classification learning in neural networks. Weight sensitivity indicates the extent of variations in a network's output when reacting to small perturbations in its weights; whereas weight interde...


Evaluating the Performance of Different Artificial Intelligence Methods and a Statistical Method in Runoff Estimation (Case Study: Shahid Noori Kakhk Gonabad Watershed)

Rainfall-runoff models have been used in hydrology and runoff estimation for many years, but despite the numerous existing models, the regular release of new ones shows that there is still no model that provides sophisticated estimates with high accuracy and performance. In order to achieve the best results, modeling and identification of the factors affecting the output of the model i...


The Method of Steepest Descent for Feedforward Artificial Neural Networks

In this paper, we implement the method of steepest descent in single- and multilayer feedforward artificial neural networks. In all previous works, the weight-update equations for single- or multilayer feedforward artificial neural networks have been derived by choosing a single activation function for the various processing units in the network. We first calculate the total error function ...
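The steepest-descent update this snippet refers to is the standard rule w ← w − η ∂E/∂w. Below is a minimal single-layer illustration with a logistic activation and half-mean-squared error; the toy data, learning rate, and iteration count are arbitrary assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linearly separable data (assumed for illustration).
X = rng.standard_normal((200, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, eta=1.0, n_iter=2000):
    """Steepest descent on E = 0.5 * mean((sigmoid(X w) - y)^2)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        out = sigmoid(X @ w)
        # dE/dw, using sigmoid'(z) = out * (1 - out)
        grad = X.T @ ((out - y) * out * (1.0 - out)) / len(y)
        w -= eta * grad          # steepest-descent step: w <- w - eta * grad
    return w

w = train(X, y)
accuracy = np.mean((sigmoid(X @ w) > 0.5) == (y > 0.5))
```

The multilayer case extends this by backpropagating the same error derivative through each layer's activation function.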



Journal:
  • IEEE Transactions on Neural Networks

Volume 1, Issue 1

Pages: -

Publication date: 1990